Convergence in neural networks with interneuronal transmission delays - 1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Author
Abstract
A neural network is a network of interconnected elementary units that have a limited set of the characteristic properties of real (biological) neurons. Each unit can receive many inputs, some of which activate the unit while others inhibit it. The neuron-like elementary unit computes a weighted sum of the inputs it receives and, when that sum exceeds a certain threshold, fires (produces) a single response and sends it down its 'axon'. Interest in the dynamical characteristics of models of neural networks has been steadily increasing during the last 40 years. In this article we consider the following model of a neural network with n units:
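The threshold behavior described above can be sketched as a simple McCulloch-Pitts-style unit (a minimal illustration; the weights, inputs, and threshold below are arbitrary example values, not taken from the paper's model):

```python
import numpy as np

def threshold_unit(inputs, weights, threshold):
    """Neuron-like unit: fires (outputs 1) when the weighted
    sum of its inputs exceeds the threshold, else stays silent."""
    s = np.dot(weights, inputs)       # weighted sum of the inputs
    return 1 if s > threshold else 0  # single all-or-none response

# Positive weights excite the unit; negative weights inhibit it
x = np.array([1.0, 0.5, 1.0])
w = np.array([0.8, -0.6, 0.4])   # second input is inhibitory
print(threshold_unit(x, w, 0.5))  # weighted sum = 0.8 - 0.3 + 0.4 = 0.9 > 0.5, so 1
```

Raising the threshold above the weighted sum (here, above 0.9) silences the unit.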
Similar resources
Delay-independent stability in bidirectional associative memory networks
It is shown that if the neuronal gains are small compared with the synaptic connection weights, then a bidirectional associative memory network with axonal signal transmission delays converges to the equilibria associated with exogenous inputs to the network. Both discrete and continuously distributed delays are considered; the asymptotic stability is global in the state space of neuronal activ...
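The small-gain condition described in this snippet can be illustrated with a minimal discrete-time sketch (the update rule, gain value, and single fixed delay below are illustrative assumptions, not the paper's exact equations): when the gain times the norm of the weight matrix is below 1, the update map is a contraction, so the state converges to the unique equilibrium set by the exogenous input regardless of the transmission delay.

```python
import numpy as np

def simulate_delayed(W, gain, I, delay=3, steps=200):
    """Iterate x(t+1) = I + W @ tanh(gain * x(t - delay)).
    If gain * ||W|| < 1 the map is a contraction, so the
    trajectory converges to a unique equilibrium for any delay."""
    n = len(I)
    hist = [np.zeros(n) for _ in range(delay + 1)]  # zero initial history
    for _ in range(steps):
        x_delayed = hist[-(delay + 1)]              # state `delay` steps back
        hist.append(I + W @ np.tanh(gain * x_delayed))
    return hist[-1]

W = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
I = np.array([1.0, 0.5])   # exogenous inputs to the two units
x = simulate_delayed(W, gain=0.2, I=I)
# At equilibrium x = I + W @ tanh(0.2 * x); the residual is near zero
print(np.linalg.norm(x - (I + W @ np.tanh(0.2 * x))))
```

With a larger gain (so that the contraction condition fails), the same iteration can oscillate instead of settling, which is the regime the delay-independent stability result rules out.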
Editorial: Welcome To The IEEE Neural Networks Society
I want to welcome you to our newly formed society. On February 17, 2002, the IEEE Neural Networks Council (NNC), publisher of the IEEE TRANSACTIONS ON NEURAL NETWORKS (TNN), the IEEE TRANSACTIONS ON FUZZY SYSTEMS (TFS), and the IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION (TEC), became the IEEE Neural Networks Society (NNS). This accomplishment was made possible by the relentless efforts of our ExCo...
Implementing radial basis functions using bump-resistor networks - 1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Radial Basis Function (RBF) networks provide a powerful learning architecture for neural networks [6]. We have implemented an RBF network in analog VLSI using the concept of bump-resistors. A bump-resistor is a nonlinear resistor whose conductance is a Gaussian-like function of the difference of two other voltages. The width of the Gaussian basis functions may be continuously varied so that the ...
A parallel genetic/neural network learning algorithm for MIMD shared memory machines
A new algorithm is presented for training of multilayer feedforward neural networks by integrating a genetic algorithm with an adaptive conjugate gradient neural network learning algorithm. The parallel hybrid learning algorithm has been implemented in C on an MIMD shared memory machine (Cray Y-MP8/864 supercomputer). It has been applied to two different domains, engineering design and image re...
February 2010 | IEEE Computational Intelligence Magazine: CIS Publication Spotlight (IEEE Transactions on Neural Networks, IEEE Transactions on Fuzzy Systems)
Digital Object Identifier: 10.1109/TNN.2009.2025946. "Convergence analysis of the online BP training algorithm was proved using two fundamental theorems: one theorem claims that under mild conditions, the gradient sequence of the error function will converge to zero (the weak convergence), and another theorem concludes the convergence of the weight sequence defined by the procedure to a fixed v...
Journal title:
Volume, Issue:
Pages: -
Publication date: 2004